More About Commercial Content Moderators and Their Work
- Child exploitation is one of the few types of content that IT companies are legally required to remove from their platforms. Commercial content moderators see this as the most effective part of their work, taking pride in helping prevent child abuse and keeping the internet safer.
- Companies regularly audit moderators and expect their reviews to yield accuracy levels over 95%.
- Moderators often leave work feeling repulsed by humanity because of the extent of cruelty and depravity they encounter while reviewing content. Although the work may not affect them every day, they frequently experience intrusive thoughts or flashbacks of the disturbing images they have seen, particularly in situations that trigger those memories.
- Many moderators lack self-awareness of how the job affects them, claiming it has no ill effects on their general well-being. At the same time, they report increased alcohol consumption and avoidance of social situations.
- Because advances in computer vision have fueled the expectation that content moderation will be automated, advocacy for the labor rights of commercial content moderators has lagged, although unionization efforts are on the rise. In reality, this work will never be fully automated and will always require a person behind the screen.
Movements Towards Better Working Conditions
- Better pay and respect for the job rather than viewing it solely as a dirty one
- Elimination of or looser non-disclosure agreements so moderators can confide in others outside of their jobs about the work they do
- More efforts towards unionization
- Automation of some of the work, although full automation of content moderation is unlikely
- Increased support for mental health and well-being
References
Books
- Behind the Screen: Content Moderation in the Shadows of Social Media by Sarah T. Roberts
Articles
- "The Underworld of Online Content Moderation" by Isaac Chotiner (The New Yorker)
- "Social Media’s Silent Filter" by Sarah T. Roberts (The Atlantic)
- "Content Moderators at YouTube, Facebook and Twitter See the Worst of the Web — and Suffer Silently" by Elizabeth Dwoskin, Jeanne Whalen, and Regine Cabato (The Washington Post)
- "Content Moderation Is Terrible by Design" by Thomas Stackpole (Harvard Business Review)
- "The Complex Debate Over Silicon Valley’s Embrace of Content Moderation" by Nellie Bowles (The New York Times)
- "The Hardest Job in Silicon Valley is a Living Nightmare" by Mark Wilson (Fast Company)
Videos
- "Content Moderators: The Gatekeepers of Social Media" by Gianluca Demartini
- "The Price of a 'Clean' Internet" by Hans Block and Moritz Riesewieck